
    One Thing After Another: Why the Passage of Time Is Not an Illusion

    Does time seem to pass, even though it doesn’t, really? Many philosophers think the answer is ‘Yes’—at least when ‘time’s passing’ is understood in a particular way. They take time’s passing to be a process by which each time in turn acquires a special status, such as the status of being the only time that exists, or being the only time that is present. This chapter suggests that, on the contrary, all we perceive is temporal succession, one thing after another, a notion to which modern physics is not inhospitable. The contents of perception are best described in terms of ‘before’ and ‘after’, rather than ‘past’, ‘present’, and ‘future’.

    Spherical Slepian functions and the polar gap in geodesy

    The estimation of potential fields such as the gravitational or magnetic potential at the surface of a spherical planet from noisy observations taken at an altitude over an incomplete portion of the globe is a classic example of an ill-posed inverse problem. Here we show that the geodetic estimation problem has deep-seated connections to Slepian's spatiospectral localization problem on the sphere, which amounts to finding bandlimited spherical functions whose energy is optimally concentrated in some closed portion of the unit sphere. This allows us to formulate an alternative solution to the traditional damped least-squares spherical harmonic approach in geodesy, whereby the source field is now expanded in a truncated Slepian function basis set. We discuss the relative performance of both methods with regard to standard statistical measures such as bias, variance, and mean-square error, and pay special attention to the algorithmic efficiency of computing the Slepian functions on the region complementary to the axisymmetric polar gap characteristic of satellite surveys. The ease, speed, and accuracy of this new method make the use of spherical Slepian functions in earth and planetary geodesy practical. Comment: 14 figures, submitted to the Geophysical Journal International.
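    The contrast drawn above, damped least-squares inversion versus expansion in a truncated and optimally concentrated basis, can be illustrated with a toy linear inverse problem. The sketch below is a loose 1-D analogue rather than an implementation of spherical Slepian functions: the forward matrix, the use of eigenvectors of A.T @ A as a stand-in for the concentration eigenbasis, and all parameter values are illustrative assumptions.

```python
# Toy 1-D analogue (not actual spherical harmonics): contrast damped least
# squares with estimation in a truncated "concentration" eigenbasis, the idea
# behind Slepian-type methods when data cover only part of the domain.
import numpy as np

rng = np.random.default_rng(0)
n_coef, n_obs = 60, 40                       # more unknowns than observations: ill-posed
t_obs = np.linspace(0.0, 0.6, n_obs)         # observations cover only part of [0, 1] ("polar gap")
A = np.array([[np.cos(np.pi * k * t) for k in range(n_coef)] for t in t_obs])

x_true = rng.standard_normal(n_coef) / (1.0 + np.arange(n_coef))   # smooth source field
y = A @ x_true + 0.05 * rng.standard_normal(n_obs)

# 1) Damped (Tikhonov) least squares.
eta = 1e-2
x_dls = np.linalg.solve(A.T @ A + eta * np.eye(n_coef), A.T @ y)

# 2) Truncated eigenbasis of the "concentration" operator A.T @ A:
#    eigenvectors with large eigenvalues are well seen by the data,
#    loosely analogous to well-concentrated Slepian functions.
eigval, eigvec = np.linalg.eigh(A.T @ A)
order = np.argsort(eigval)[::-1]
basis = eigvec[:, order[:20]]                # keep the 20 best-observed modes
coef, *_ = np.linalg.lstsq(A @ basis, y, rcond=None)
x_truncated = basis @ coef

print("damped LS error      :", np.linalg.norm(x_dls - x_true))
print("truncated basis error:", np.linalg.norm(x_truncated - x_true))
```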

    Generative discriminative models for multivariate inference and statistical mapping in medical imaging

    This paper presents a general framework for obtaining interpretable multivariate discriminative models that allow efficient statistical inference for neuroimage analysis. The framework, termed generative discriminative machine (GDM), augments discriminative models with a generative regularization term. We demonstrate that the proposed formulation can be optimized in closed form and in dual space, allowing efficient computation for high-dimensional neuroimaging datasets. Furthermore, we provide an analytic estimation of the null distribution of the model parameters, which enables efficient statistical inference and p-value computation without the need for permutation testing. We compared the proposed method with both purely generative and discriminative learning methods in two large structural magnetic resonance imaging (sMRI) datasets of Alzheimer's disease (AD) (n=415) and schizophrenia (n=853). Using the AD dataset, we demonstrated the ability of GDM to robustly handle confounding variations. Using the schizophrenia dataset, we demonstrated the ability of GDM to handle multi-site studies. Taken together, the results underline the potential of the proposed approach for neuroimaging analyses. Comment: To appear in MICCAI 2018 proceedings.
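    The abstract does not spell out the GDM objective or its dual-space solution, but the general pattern of augmenting a discriminative fit with a generative regularizer can be sketched: below, a ridge-style linear classifier is additionally pulled toward the class-mean-difference direction, and the combined objective still has a closed-form minimizer. The penalty form, function names, and simulated data are illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch of a "generative-discriminative" linear model: a ridge
# regression-to-the-labels classifier whose weights are also pulled toward a
# generative direction (the class-mean difference). The penalty form and all
# names are illustrative assumptions, not the GDM objective from the paper.
import numpy as np

def generative_discriminative_fit(X, y, lam_ridge=1.0, lam_gen=1.0):
    """X: (n, d) features; y: labels in {-1, +1}."""
    n, d = X.shape
    # Generative direction: difference of class means (LDA-like, identity covariance).
    w_gen = X[y == 1].mean(axis=0) - X[y == -1].mean(axis=0)
    w_gen /= np.linalg.norm(w_gen)
    # Closed-form minimizer of ||Xw - y||^2 + lam_ridge*||w||^2 + lam_gen*||w - w_gen||^2.
    lhs = X.T @ X + (lam_ridge + lam_gen) * np.eye(d)
    rhs = X.T @ y + lam_gen * w_gen
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(1)
n, d = 200, 50
y = np.where(rng.random(n) < 0.5, 1, -1)
X = rng.standard_normal((n, d)) + 0.5 * y[:, None] * np.ones(d)   # shifted class means
w = generative_discriminative_fit(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```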

    Moving Beyond Noninformative Priors: Why and How to Choose Weakly Informative Priors in Bayesian Analyses

    Over the last two decades, Bayesian statistical methods have proliferated throughout ecology and evolution. Numerous previous references have established both philosophical and computational guidelines for implementing Bayesian methods. However, protocols for incorporating prior information, the defining characteristic of Bayesian philosophy, are nearly nonexistent in the ecological literature. Here, I hope to encourage the use of weakly informative priors in ecology and evolution by providing a ‘consumer’s guide’ to weakly informative priors. The first section outlines three reasons why ecologists should abandon noninformative priors: 1) common flat priors are not always noninformative, 2) noninformative priors provide the same result as simpler frequentist methods, and 3) noninformative priors suffer from the same high type I and type M error rates as frequentist methods. The second section provides a guide for implementing informative priors, wherein I detail convenient ‘reference’ prior distributions for common statistical models (i.e. regression, ANOVA, hierarchical models). I then use simulations to visually demonstrate how informative priors influence posterior parameter estimates. With the guidelines provided here, I hope to encourage the use of weakly informative priors for Bayesian analyses in ecology. Ecologists can and should debate the appropriate form of prior information, but should consider weakly informative priors as the new ‘default’ prior for any Bayesian model.
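    As a minimal illustration of the contrast between a so-called noninformative prior and a weakly informative one, the sketch below fits the same small-sample regression twice in PyMC, once with a very diffuse Normal prior on the slope and once with a Normal(0, 1) prior on a standardized predictor. The prior scales are common illustrative defaults, not the specific reference priors recommended in the paper.

```python
# Hedged sketch: flat-ish vs. weakly informative priors for a simple regression.
# The prior scales are illustrative defaults, not the paper's recommendations;
# assumes the predictor and response are roughly standardized.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
n = 30
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)            # true slope 0.5, noisy, small sample

with pm.Model() as flat_model:                  # "noninformative": very diffuse slope prior
    beta = pm.Normal("beta", mu=0.0, sigma=1000.0)
    sigma = pm.HalfNormal("sigma", sigma=10.0)
    pm.Normal("y", mu=beta * x, sigma=sigma, observed=y)
    flat_trace = pm.sample(1000, tune=1000, progressbar=False)

with pm.Model() as weak_model:                  # weakly informative: Normal(0, 1) on the slope
    beta = pm.Normal("beta", mu=0.0, sigma=1.0)
    sigma = pm.HalfNormal("sigma", sigma=10.0)
    pm.Normal("y", mu=beta * x, sigma=sigma, observed=y)
    weak_trace = pm.sample(1000, tune=1000, progressbar=False)

print("flat prior slope estimate:", flat_trace.posterior["beta"].mean().item())
print("weak prior slope estimate:", weak_trace.posterior["beta"].mean().item())
```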

    Quality management in heavy duty manufacturing industry: TQM vs. Six Sigma

    ‘Is TQM a management fad?’ This question has been extensively documented in the quality management literature, and it is tackled in this research through a critical literature review of the area. The ‘TQM versus Six Sigma’ debate, which has also been a fundamental challenge in this research field, is addressed by a thematic and chronological review of the peer-reviewed papers. To evaluate this challenge in practice, primary research in the heavy-duty machinery production industry was conducted through a case study of J C Bamford Excavators Ltd (JCB), the largest European construction machinery producer. The results highlight that TQM is a natural foundation on which to build Six Sigma; not surprisingly, the quality yield of a TQM approach complemented by Six Sigma is far higher and more stable than when TQM is put in place with no Six Sigma focus, so the overall finding is that TQM and Six Sigma are complements, not substitutes. The study concludes with an overview of quality management approaches in the heavy-duty manufacturing industry to highlight the way forward for the industry.

    On the combination of omics data for prediction of binary outcomes

    Enrichment of predictive models with new biomolecular markers is an important task in high-dimensional omic applications. Increasingly, clinical studies include several sets of such omics markers available for each patient, measuring different levels of biological variation. As a result, one of the main challenges in predictive research is the integration of different sources of omic biomarkers for the prediction of health traits. We review several approaches for the combination of omic markers in the context of binary outcome prediction, all based on double cross-validation and regularized regression models. We evaluate their performance in terms of calibration and discrimination, and we compare their performance with respect to single-omic-source predictions. We illustrate the methods through the analysis of two real datasets. On the one hand, we consider the combination of two fractions of proteomic mass spectrometry for the calibration of a diagnostic rule for the detection of early-stage breast cancer. On the other hand, we consider transcriptomics and metabolomics as predictors of obesity using data from the Dietary, Lifestyle, and Genetic determinants of Obesity and Metabolic syndrome (DILGOM) study, a population-based cohort from Finland.
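    One simple way to realize the combination strategy described above is early integration: concatenate the omic blocks, fit a penalized logistic regression, and use double (nested) cross-validation, with an inner loop choosing the penalty and an outer loop estimating discrimination. The sketch below, using simulated stand-in data and scikit-learn, illustrates that pattern only; the specific combination approaches reviewed in the paper may differ.

```python
# Hedged sketch of one way to combine two omic blocks for binary prediction:
# concatenate the blocks, fit a penalized logistic regression, and use nested
# (double) cross-validation -- an inner loop to choose the penalty, an outer
# loop to estimate discrimination. The data here are simulated stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 120
omic1 = rng.standard_normal((n, 200))            # e.g. one proteomic fraction
omic2 = rng.standard_normal((n, 300))            # e.g. a transcriptomic block
y = (omic1[:, 0] + omic2[:, 0] + rng.standard_normal(n) > 0).astype(int)

X = np.hstack([omic1, omic2])                    # simple early integration
inner = GridSearchCV(                            # inner loop: choose the penalty strength
    make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", max_iter=5000)),
    param_grid={"logisticregression__C": [0.01, 0.1, 1.0, 10.0]},
    cv=5, scoring="roc_auc",
)
auc = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")   # outer loop: discrimination
print("nested-CV AUC:", auc.mean().round(3))
```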

    Case study in Six Sigma methodology: manufacturing quality improvement and guidance for managers

    This article discusses the successful implementation of Six Sigma methodology in a high-precision and critical process in the manufacture of automotive products. The Six Sigma define–measure–analyse–improve–control approach resulted in a reduction of tolerance-related problems and improved the first pass yield from 85% to 99.4%. Data were collected on all possible causes, and regression analysis, hypothesis testing, Taguchi methods, classification and regression trees, and other tools were used to analyse the data and draw conclusions. Implementation of the Six Sigma methodology had a significant financial impact on the profitability of the company. An approximate saving of US$70,000 per annum was reported, in addition to the customer-facing benefits of improved quality on returns and sales. The project also had the benefit of allowing the company to learn useful lessons that will guide future Six Sigma activities.

    Differential expression analysis with global network adjustment

    Background: Large-scale chromosomal deletions or other non-specific perturbations of the transcriptome can alter the expression of hundreds or thousands of genes, and it is of biological interest to understand which genes are most profoundly affected. We present a method for predicting a gene’s expression as a function of other genes, thereby accounting for the effect of transcriptional regulation that confounds the identification of genes differentially expressed relative to a regulatory network. The challenge in constructing such models is that the number of possible regulator transcripts within a global network is on the order of thousands, while the number of biological samples is typically on the order of 10. Nevertheless, there are large gene expression databases that can be used to construct networks that could be helpful in modeling transcriptional regulation in smaller experiments. Results: We demonstrate a type of penalized regression model that can be estimated from large gene expression databases and then applied to smaller experiments. The ridge parameter is selected by minimizing the cross-validation error of the predictions in the independent out-sample. This tends to increase the model stability and leads to a much greater degree of parameter shrinkage, but the resulting biased estimation is mitigated by a second round of regression. Nevertheless, the proposed computationally efficient “over-shrinkage” method outperforms previously used LASSO-based techniques. In two independent datasets, we find that the median proportion of explained variability in expression is approximately 25%, and this results in a substantial increase in the signal-to-noise ratio, allowing more powerful inferences on differential gene expression and leading to biologically intuitive findings. We also show that a large proportion of gene dependencies are conditional on the biological state, which would be impossible to detect with standard differential expression methods. Conclusions: By adjusting for the effects of the global network on individual genes, both the sensitivity and reliability of differential expression measures are greatly improved.
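    The Results paragraph describes a two-stage scheme: a heavily shrunk ridge model for each gene, trained on a large reference database, followed by a second round of regression in the small experiment to mitigate the shrinkage bias. The sketch below follows that description only loosely; the data are simulated stand-ins, and the actual “over-shrinkage” procedure and cross-validation setup are assumptions.

```python
# Hedged sketch of the two-stage idea: (1) ridge-predict a gene's expression
# from the other genes using a large reference dataset, with the penalty chosen
# by cross-validation; (2) in the small experiment, regress observed on
# predicted expression so the attenuation from heavy shrinkage is partly
# corrected before computing network-adjusted residuals.
import numpy as np
from sklearn.linear_model import LinearRegression, RidgeCV

rng = np.random.default_rng(0)
n_ref, n_exp, n_genes = 500, 10, 300
reference = rng.standard_normal((n_ref, n_genes))      # large expression database
experiment = rng.standard_normal((n_exp, n_genes))     # small experiment

gene = 0                                               # gene being adjusted
others = np.delete(np.arange(n_genes), gene)

# Stage 1: heavily shrunk ridge model trained on the reference database.
ridge = RidgeCV(alphas=np.logspace(0, 4, 20), cv=5)
ridge.fit(reference[:, others], reference[:, gene])

# Stage 2: second regression of observed on predicted expression in the
# small experiment, mitigating the shrinkage-induced bias.
pred = ridge.predict(experiment[:, others]).reshape(-1, 1)
second = LinearRegression().fit(pred, experiment[:, gene])
adjusted = experiment[:, gene] - second.predict(pred)  # network-adjusted residuals
print("residual variance:", adjusted.var().round(3))
```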

    Selection of tuning parameters in bridge regression models via Bayesian information criterion

    We consider bridge linear regression modeling, which can produce a sparse or non-sparse model. A crucial point in the model building process is the selection of the adjusted parameters, namely a regularization parameter and a tuning parameter, in bridge regression models. The choice of the adjusted parameters can be viewed as a model selection and evaluation problem. We propose a model selection criterion for evaluating bridge regression models from a Bayesian approach. This selection criterion enables us to select the adjusted parameters objectively. We investigate the effectiveness of our proposed modeling strategy through some numerical examples. Comment: 20 pages, 5 figures.
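    Bridge regression penalizes the residual sum of squares with a term of the form lambda * sum(|beta_j|^q), so both lambda and the exponent q must be tuned. The sketch below fits the bridge estimate on a small grid of (lambda, q) values and scores each fit with a generic BIC whose degrees of freedom are approximated by the number of effectively nonzero coefficients; this is an illustration of the setting, not the Bayesian criterion proposed in the paper.

```python
# Hedged sketch of bridge regression with grid-based tuning-parameter selection.
# The bridge estimate minimizes ||y - X b||^2 + lam * sum(|b_j|**q); each
# (lam, q) pair is scored with a generic BIC whose degrees of freedom are
# approximated by the count of effectively nonzero coefficients -- an
# illustration only, not the criterion proposed in the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 1.0] + [0.0] * (p - 3))
y = X @ beta_true + rng.standard_normal(n)

def bridge_fit(lam, q):
    obj = lambda b: np.sum((y - X @ b) ** 2) + lam * np.sum(np.abs(b) ** q)
    return minimize(obj, np.zeros(p), method="Powell").x   # derivative-free: |b|^q is non-smooth

best = None
for lam in [0.1, 1.0, 10.0, 50.0]:
    for q in [0.5, 1.0, 1.5, 2.0]:
        b = bridge_fit(lam, q)
        rss = np.sum((y - X @ b) ** 2)
        df = np.sum(np.abs(b) > 1e-3)                      # crude effective degrees of freedom
        bic = n * np.log(rss / n) + df * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, lam, q, b)

print("selected (lambda, q):", best[1], best[2])
```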

    Bayesian reconstruction of the cosmological large-scale structure: methodology, inverse algorithms and numerical optimization

    We address the inverse problem of cosmic large-scale structure reconstruction from a Bayesian perspective. For a linear data model, a number of known and novel reconstruction schemes, which differ in terms of the underlying signal prior, data likelihood, and numerical inverse extra-regularization schemes, are derived and classified. The Bayesian methodology presented in this paper tries to unify and extend the following methods: Wiener filtering, Tikhonov regularization, Ridge regression, Maximum Entropy, and inverse regularization techniques. The inverse techniques considered here are asymptotic regularization, the Jacobi, Steepest Descent, Newton-Raphson, and Landweber-Fridman methods, and both linear and non-linear Krylov methods based on Fletcher-Reeves, Polak-Ribiere, and Hestenes-Stiefel Conjugate Gradients. The structures of the up-to-date highest-performing algorithms are presented, based on an operator scheme which permits one to exploit the power of fast Fourier transforms. Using such an implementation of the generalized Wiener filter in the novel ARGO software package, the different numerical schemes are benchmarked with 1-, 2-, and 3-dimensional problems including structured white and Poissonian noise, data windowing and blurring effects. A novel numerical Krylov scheme is shown to be superior in terms of performance and fidelity. These fast inverse methods will ultimately enable the application of sampling techniques to explore complex joint posterior distributions. We outline how the space of the dark-matter density field, the peculiar velocity field, and the power spectrum can be jointly investigated by a Gibbs-sampling process. Such a method can be applied for the correction of redshift distortions in the observed galaxies and for time-reversal reconstructions of the initial density field. Comment: 40 pages, 11 figures.
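    A hedged 1-D toy version of the operator approach described above: the Wiener-filter system (S^-1 + N^-1) s = N^-1 d is solved with conjugate gradients, and the signal-prior inverse is applied diagonally in Fourier space so that only FFTs and elementwise operations are needed. The response operator (taken as the identity), power spectrum, and noise level are illustrative assumptions, and nothing here reproduces the ARGO implementation or the benchmarked Krylov variants.

```python
# Hedged 1-D sketch of the operator approach: solve the Wiener-filter system
# (S^-1 + N^-1) s = N^-1 d with conjugate gradients, applying the signal-prior
# inverse S^-1 diagonally in Fourier space so only FFTs are needed. A toy
# stand-in for the 3-D, windowed, blurred problems treated in the paper.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

npix = 256
k = np.fft.fftfreq(npix)
power = 1.0 / (1e-3 + np.abs(k)) ** 2           # assumed prior power spectrum P(k)
noise_var = 0.5

rng = np.random.default_rng(0)
signal = np.fft.ifft(np.sqrt(power) * np.fft.fft(rng.standard_normal(npix))).real
data = signal + np.sqrt(noise_var) * rng.standard_normal(npix)

def apply_A(x):
    """Apply (S^-1 + N^-1) to x, with S^-1 diagonal in Fourier space."""
    s_inv_x = np.fft.ifft(np.fft.fft(x) / power).real
    return s_inv_x + x / noise_var

A = LinearOperator((npix, npix), matvec=apply_A, dtype=float)
recon, info = cg(A, data / noise_var)           # Wiener-filter reconstruction
print("CG converged:", info == 0, "| residual rms:", np.std(recon - signal).round(3))
```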